Software Development
Deep Learning for Natural Language Processing
Deep Learning for NLP: GitHub Bug Prediction Analysis
Deep Learning for NLP: Introduction
Deep Learning for NLP: Memory-based Networks
Deep Learning for NLP: Neural Network Architectures
Deep Learning for NLP: Transfer Learning

Deep Learning for NLP: GitHub Bug Prediction Analysis

Course Number:
it_nldlnpdj_05_enus
Lesson Objectives

Deep Learning for NLP: GitHub Bug Prediction Analysis

  • discover the key concepts covered in this course
  • explain GitHub bug data and problem statement
  • perform library loading, data loading, and basic overview of columns
  • read a CSV file in Google Colab
  • demonstrate Exploratory Data Analysis (EDA) - word count analysis and label analysis
  • demonstrate EDA - punctuation analysis, stop word analysis, and word cloud
  • clean and preprocess data using advanced techniques
  • clean data using functions
  • use count vectorization and term frequency-inverse document frequency (TF-IDF) methods with visualizations
  • perform advanced embeddings like Word2Vec and apply AdaBoost classifier
  • work with deep learning models using embeddings
  • compare and contrast logistic regression, random forest, AdaBoost, and long short-term memory (LSTM) classifiers
  • summarize the key concepts covered in this course

Overview/Description
Get down to solving real-world GitHub bug prediction problems in this case study course. Examine the process of data and library loading and perform basic exploratory data analysis (EDA), including word count, label, punctuation, and stop word analysis. Explore how to clean and preprocess data in order to use vectorization and embeddings, and use count vectorization and term frequency-inverse document frequency (TF-IDF) methods with visualizations. Finally, assess different classifiers such as logistic regression, random forest, and AdaBoost. Upon completing this course, you will understand how to solve industry-level problems using deep learning methodology in the TensorFlow ecosystem.
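As a sketch of the TF-IDF weighting this course covers, the computation can be done by hand in plain Python. The corpus below is made up of hypothetical issue titles, not the course's GitHub dataset:

```python
import math
from collections import Counter

# Toy corpus standing in for GitHub issue titles (hypothetical examples).
docs = [
    "crash when saving file",
    "feature request add dark mode",
    "crash on startup after update",
]

def tfidf(docs):
    """Compute a simple TF-IDF weight for every (doc, term) pair."""
    tokenized = [d.split() for d in docs]
    n = len(tokenized)
    # Document frequency: in how many documents does each term appear?
    df = Counter(t for doc in tokenized for t in set(doc))
    weights = []
    for doc in tokenized:
        tf = Counter(doc)
        weights.append({
            t: (tf[t] / len(doc)) * math.log(n / df[t])
            for t in tf
        })
    return weights

weights = tfidf(docs)
# "crash" appears in two of three docs, so its IDF is low;
# "saving" appears in only one, so it scores higher within doc 0.
print(weights[0]["saving"] > weights[0]["crash"])  # True
```

In practice the course's workflow would use a library vectorizer (e.g. scikit-learn's `TfidfVectorizer`) rather than this hand-rolled version; the sketch only shows what the weights mean.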

Target

Prerequisites: none

Deep Learning for NLP: Introduction

Course Number:
it_nldlnpdj_01_enus
Lesson Objectives

Deep Learning for NLP: Introduction

  • discover the key concepts covered in this course
  • recall basic concepts of natural language processing (NLP) with deep learning (DL)
  • illustrate various use cases in NLP across different industries
  • outline the basic concepts of spaCy and TensorFlow
  • outline the basic concepts of Keras and PyTorch
  • outline the basic concepts of Open Neural Machine Translation (OpenNMT) and DeepNL
  • define basic concepts of sentiment data
  • explore the end-to-end components for a natural language processing (NLP) sentiment dataset
  • illustrate the basics of data loading and columns
  • demonstrate exploratory data analysis (EDA) of sentiment data
  • demonstrate pre-processing and feature engineering of sentiment data
  • demonstrate simple machine learning (ML) modeling, tuning, and evaluation using Keras
  • demonstrate creating accuracy graphs and graphs for loss over time
  • summarize the key concepts covered in this course

Overview/Description
In recent times, natural language processing (NLP) has seen many advancements, most of them driven by deep learning models. NLP is a complicated problem, and deep learning models can handle its scale and complexity through many different variations of neural network architecture. Deep learning also offers a broad spectrum of frameworks that support NLP problem solving out of the box. Explore the basics of deep learning and different architectures for NLP-specific problems. Examine other use cases for deep learning NLP across industries. Learn about various tools and frameworks used, such as spaCy, TensorFlow, PyTorch, and OpenNMT. Investigate sentiment analysis and explore how to solve a problem using various deep learning steps and frameworks. Upon completing this course, you will be able to use the essential fundamentals of deep learning for NLP and outline its various industry use cases, frameworks, and fundamental sentiment analysis problems.
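The word-count and label analysis steps of the EDA covered here can be sketched with the standard library alone. The reviews and labels below are invented toy records, not the course's sentiment dataset:

```python
from collections import Counter

# Hypothetical toy sentiment records (text, label).
reviews = [
    ("great product works well", "positive"),
    ("terrible quality broke fast", "negative"),
    ("works great love it", "positive"),
]

# Label analysis: class distribution across the dataset.
labels = Counter(label for _, label in reviews)

# Word-count analysis: tokens per review and most frequent tokens overall.
lengths = [len(text.split()) for text, _ in reviews]
vocab = Counter(t for text, _ in reviews for t in text.split())

print(labels)    # Counter({'positive': 2, 'negative': 1})
print(lengths)   # [4, 4, 4]
print(vocab.most_common(2))
```

On a real dataset the same counts would typically be computed with pandas and then plotted; the logic is identical.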

Target

Prerequisites: none

Deep Learning for NLP: Memory-based Networks

Course Number:
it_nldlnpdj_03_enus
Lesson Objectives

Deep Learning for NLP: Memory-based Networks

  • discover the key concepts covered in this course
  • outline the importance of memory-based learning and the different networks it supports
  • outline gated recurrent unit (GRU) and how it differs from recurrent neural networks (RNNs)
  • outline long short-term memory (LSTM) networks and how they differ from RNNs
  • illustrate how LSTM networks work better and solve the vanishing gradient problem
  • illustrate different types of LSTM networks
  • perform data preparation for LSTM and GRU networks
  • perform review classification using GRU
  • perform review classification using LSTM
  • perform review classification using bidirectional long short-term memory (Bi-LSTM)
  • compare results of important features across different networks
  • summarize the key concepts covered in this course

Overview/Description
In the journey to understand deep learning models for natural language processing (NLP), the subsequent iterations are memory-based networks, which are much more capable of handling extended context in languages. While basic neural networks are better than machine learning (ML) models, they still fall short on larger, more complex language data problems. In this course, you will learn about memory-based networks like the gated recurrent unit (GRU) and long short-term memory (LSTM). Explore their architectures, variants, and where they work and fail for NLP. Then, consider their implementations using product classification data and compare different results to understand each architecture's effectiveness. Upon completing this course, you will have learned the basics of memory-based networks and their implementation in TensorFlow to understand the effect of memory and more extended context for NLP datasets.
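One reason this course contrasts LSTMs with plain RNNs is the vanishing gradient problem: the gradient flowing back through T time steps is roughly a product of T per-step factors, and when those factors are below 1 the product collapses. This toy arithmetic illustration (not course code) makes that concrete:

```python
def backprop_factor(per_step, steps):
    """Product of identical per-step gradient factors over `steps` steps,
    a stand-in for the gradient a plain RNN propagates through time."""
    result = 1.0
    for _ in range(steps):
        result *= per_step
    return result

short = backprop_factor(0.9, 5)     # ~0.59: signal survives a short context
long = backprop_factor(0.9, 100)    # ~2.7e-5: signal all but gone
print(short, long)
```

LSTM and GRU cells mitigate this with gated additive state updates, which is why they handle the extended context that plain RNNs lose.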

Target

Prerequisites: none

Deep Learning for NLP: Neural Network Architectures

Course Number:
it_nldlnpdj_02_enus
Lesson Objectives

Deep Learning for NLP: Neural Network Architectures

  • discover the key concepts covered in this course
  • illustrate the single-layer perceptron architecture of a neural network
  • illustrate the multi-layer perceptron (MLP) architecture of a neural network
  • describe the recurrent neural network (RNN) architecture and how it can capture context in language
  • describe the various challenges of RNNs
  • illustrate different applications of basic neural network-based architectures
  • describe the Amazon Product Reviews dataset and list the libraries that are required to be imported
  • describe the steps to load the Amazon Product Reviews dataset into Google Colaboratory
  • explore the data and its distribution in the Amazon Product Reviews dataset
  • analyze the product review data using pandas, graphs, and charts
  • describe the steps involved in pre-processing the product review dataset
  • illustrate word representations using one-hot encodings
  • illustrate word vector representations using neural networks and Word2Vec
  • create average feature vectors of all the words in the word vector
  • create word embedding vectors using Word2Vec
  • construct an RNN model with Word2Vec embeddings
  • illustrate sentence vector representations using GloVe vectors
  • perform classification of product review data using RNN
  • summarize the key concepts covered in this course

Overview/Description
Natural language processing (NLP) is constantly evolving with cutting-edge advancements in tools and approaches. Neural network architecture (NNA) supports this evolution by providing a method of processing language-based information to solve complex data-driven problems. Explore the basic NNAs relevant to NLP problems. Learn the different challenges and use cases of single-layer perceptrons, multi-layer perceptrons, and RNNs. Analyze data and its distribution using pandas, graphs, and charts. Examine word vector representations using one-hot encodings, Word2Vec, and GloVe, and classify data using recurrent neural networks. After you have completed this course, you will be able to use a product classification dataset to implement neural networks for NLP problems.
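Two of the representation steps this course walks through, one-hot encoding and averaging word vectors into a fixed-length sentence vector, can be sketched in plain Python. The vocabulary and sentence below are hypothetical:

```python
def one_hot_vocab(words):
    """Map each word to a one-hot vector over the sorted vocabulary."""
    vocab = sorted(set(words))
    return {
        w: [1.0 if i == vocab.index(w) else 0.0 for i in range(len(vocab))]
        for w in vocab
    }

def average_vector(sentence, vectors):
    """Average the vectors of the sentence's known words: the mean pooling
    used to build a single feature vector per review."""
    rows = [vectors[w] for w in sentence.split() if w in vectors]
    return [sum(col) / len(rows) for col in zip(*rows)]

vectors = one_hot_vocab(["good", "bad", "battery", "screen"])
sent_vec = average_vector("good battery", vectors)
# Each present word contributes 1/2 in its own dimension.
print(sent_vec)  # [0.0, 0.5, 0.5, 0.0]
```

With Word2Vec or GloVe the per-word vectors would be dense and learned rather than one-hot, but the averaging step is the same.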

Target

Prerequisites: none

Deep Learning for NLP: Transfer Learning

Course Number:
it_nldlnpdj_04_enus
Lesson Objectives

Deep Learning for NLP: Transfer Learning

  • discover the key concepts covered in this course
  • define transfer learning and illustrate how it helps to get better results
  • outline advantages and challenges of transfer learning in real world problem solving
  • illustrate the use of language modeling in transfer learning
  • outline key concepts related to FastText and Word2Vec
  • outline key concepts related to ELMo
  • outline key concepts related to ULMFiT
  • build an ELMo embedding layer for product reviews data classification
  • create an ELMo model for product reviews data classification
  • perform review classification using ELMo and FastText
  • reshape data to adjust to ELMo embedding layer requirements
  • build a simple language model using ULMFiT on the product reviews data
  • implement and fine-tune the language model (LM) using ULMFiT
  • perform review classification using ULMFiT and FastText
  • illustrate model comparison
  • summarize the key concepts covered in this course

Overview/Description
An essential aspect of human intelligence is our learning process, constantly augmented by the transfer of concepts and fundamentals. For example, as children we learn the basic alphabet, grammar, and words, and through the transfer of these fundamentals we can then read books and communicate with people. This is what transfer learning helps us achieve in deep learning as well. This course will help you learn the fundamentals of transfer learning for NLP, its various challenges, and its use cases. Explore various transfer learning models such as ELMo and ULMFiT. Upon completing this course, you will understand the transfer learning methodology of solving NLP problems and be able to experiment with various models in TensorFlow.
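The reuse idea behind transfer learning can be shown in miniature: embed words with a "pretrained" table learned elsewhere, and train only a small task-specific layer on top. The embedding values and words below are invented for illustration and do not come from ELMo, ULMFiT, or the course materials:

```python
# Hypothetical 2-d "pretrained" word vectors (made up for this sketch).
PRETRAINED = {
    "love": [0.9, 0.1],
    "hate": [-0.8, 0.2],
    "this": [0.0, 0.0],
}

def sentence_score(text, embeddings, weights=(1.0, 0.0)):
    """Embed words with the pretrained table, average them, then apply a
    small task-specific linear layer (the only part we would train)."""
    vecs = [embeddings[w] for w in text.split() if w in embeddings]
    avg = [sum(col) / len(vecs) for col in zip(*vecs)]
    return sum(a * w for a, w in zip(avg, weights))

print(sentence_score("love this", PRETRAINED) > 0)   # True
print(sentence_score("hate this", PRETRAINED) > 0)   # False
```

Real transfer learning pipelines such as ELMo or ULMFiT replace the lookup table with a pretrained language model and fine-tune it, but the division of labor, reused representation plus a small trained head, is the same.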

Target

Prerequisites: none
